# Semantic Relation Classification
## RelBERT RoBERTa-large

RelBERT is a relation-embedding model based on RoBERTa-large. It is trained on the SemEval-2012 Task 2 dataset with noise-contrastive estimation (NCE), and maps a word pair to a vector representing the semantic relation between the two words.

- Task: Text Embedding
- Library: Transformers
- Maintainer: relbert
- Downloads: 97 · Likes: 2
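A minimal sketch of using such a relation-embedding model for analogy-style comparison. The `relbert` package and its `RelBERT`/`get_embedding` interface, as well as the model id `relbert/relbert-roberta-large`, are assumptions based on this listing, not something the page itself documents; the cosine helper is plain Python.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def relation_embeddings(pairs):
    """Embed (head, tail) word pairs with RelBERT.

    Assumed API: requires `pip install relbert`; the model id and the
    RelBERT.get_embedding call are taken on trust from the package, not
    from this listing.
    """
    from relbert import RelBERT  # lazy import: heavy, downloads weights
    model = RelBERT("relbert/relbert-roberta-large")
    return model.get_embedding(pairs)

# Intended usage (not run here): analogous pairs such as
# ("Paris", "France") and ("Tokyo", "Japan") should yield relation
# vectors with high cosine similarity.
```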
## MNLI 1

BERT is a pre-trained language model based on the Transformer architecture, developed by Google, that performs well on a range of natural language processing tasks, including text classification, question answering, and named entity recognition. This checkpoint (MNLI 1) applies it to MNLI-style natural language inference.

- Task: Text Classification
- Library: Transformers
- Maintainer: kangnichaluo
- Downloads: 14 · Likes: 0
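A minimal sketch of text-pair classification with a BERT MNLI checkpoint via the Hugging Face `transformers` pipeline. The model id `kangnichaluo/mnli-1` is assembled from this listing's maintainer and model names and is an assumption; any BERT-based MNLI checkpoint would be used the same way. The softmax helper is plain Python.

```python
import math

def softmax(logits):
    """Convert raw logits to probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def classify(premise, hypothesis):
    """Score an NLI premise/hypothesis pair.

    Requires `pip install transformers`; the model id is an assumption
    based on this listing.
    """
    from transformers import pipeline  # lazy import: downloads weights
    clf = pipeline("text-classification", model="kangnichaluo/mnli-1")
    # The pipeline accepts a text/text_pair dict for sentence-pair tasks.
    return clf({"text": premise, "text_pair": hypothesis})

# Intended usage (not run here):
# classify("A man is eating.", "Someone is consuming food.")
# should favor the entailment label.
```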